Big Data Analyst
Job Description
Designing and implementing ETL architecture in AWS Glue using PySpark and SQL (a minimal sketch of such a job appears after this list).
Designing and developing transactional and analytical data structures.
Aligning data technical capabilities with business objectives.
Developing reporting solutions to support Business Intelligence, including resolving problems related to data analysis and reporting services.
Establishing and maintaining policies, operational procedures, and associated documentation for user interaction with the database environment.
Following Firstmac standards and best practices for the deployment of AWS components and code, using source control tools such as Bitbucket and DevOps tooling (e.g., Terraform) and the CLI as appropriate.
Recommending improvements to existing approaches and practices within Data Governance forums and other appropriate meetings.
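For illustration only, the sketch below shows the kind of Glue PySpark job referred to in the first responsibility above. The catalog database, table, and S3 bucket names are placeholders, not actual Firstmac resources.

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Resolve the job name passed in by Glue at run time.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read source data registered in the Glue Data Catalog
    # ("loans_db" and "loan_applications" are placeholder names).
    source = glue_context.create_dynamic_frame.from_catalog(
        database="loans_db", table_name="loan_applications"
    )

    # Example transformation in Spark SQL: count applications per day.
    source.toDF().createOrReplaceTempView("loan_applications")
    daily = spark.sql(
        """
        SELECT application_date, COUNT(*) AS applications
        FROM loan_applications
        GROUP BY application_date
        """
    )

    # Write the result back to S3 as partitioned Parquet
    # (the bucket is a placeholder).
    daily.write.mode("overwrite").partitionBy("application_date").parquet(
        "s3://example-analytics-bucket/curated/daily_applications/"
    )

    job.commit()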
Qualifications
Python 3+, including virtual environments, pip, PyPy, package development, Boto3, SQLAlchemy, and pytest (see the short Boto3/pytest sketch after this list)
ETL / ELT experience with SQL & Python within a big data environment.
Experience with SQL and NoSQL databases
Excellent coding skills and source control management (i.e., Git) across programming languages, with a strong focus on unit and integration testing
SQL (ANSI, T-SQL, PL/pgSQL)
Experience in deploying AWS components remotely using the AWS CLI
Ability to perform database modelling and to design, develop, debug, and troubleshoot database objects, stored procedures, and functions
Knowledge of Linux, with the ability to write shell scripts
Experience in developing and maintaining software solutions within the AWS ecosystem (S3, Redshift, DynamoDB).
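As a rough illustration of the Boto3 and pytest skills listed above, the sketch below unit tests a small S3 helper using botocore's Stubber so no AWS credentials or live resources are needed; the function, bucket, and key names are hypothetical.

    import boto3
    from botocore.stub import Stubber


    def list_report_keys(s3_client, bucket, prefix):
        """Return the object keys under a prefix (e.g., exported reports)."""
        response = s3_client.list_objects_v2(Bucket=bucket, Prefix=prefix)
        return [item["Key"] for item in response.get("Contents", [])]


    def test_list_report_keys():
        # Stub the S3 API so the test runs offline, without credentials.
        s3 = boto3.client("s3", region_name="ap-southeast-2")
        stubber = Stubber(s3)
        stubber.add_response(
            "list_objects_v2",
            {"Contents": [{"Key": "reports/2024-01-01.parquet"}]},
            {"Bucket": "example-bucket", "Prefix": "reports/"},
        )
        with stubber:
            keys = list_report_keys(s3, "example-bucket", "reports/")
        assert keys == ["reports/2024-01-01.parquet"]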
About The Financial Service
The company has grown from a small family business into a leading non-bank lender, managing a diverse portfolio of loans and investments. It is self-funded through bonds and actively supports its community by offering employees paid charity days and participating in local initiatives.